
    SWIR Calibration of Spectralon Reflectance Factor

    Satellite instruments operating in the reflective solar wavelength region require accurate and precise determination of the Bidirectional Reflectance Factor (BRF) of laboratory-based diffusers used in their pre-flight and on-orbit radiometric calibrations. BRF measurements are required throughout the reflected-solar spectrum from the ultraviolet through the shortwave infrared. Spectralon diffusers are commonly used as a reflectance standard for bidirectional and hemispherical geometries. The Diffuser Calibration Laboratory (DCaL) at NASA's Goddard Space Flight Center is a secondary calibration facility with reflectance measurements traceable to those made by the Spectral Tri-function Automated Reference Reflectometer (STARR) facility at the National Institute of Standards and Technology (NIST). For more than two decades, the DCaL has provided numerous NASA projects with BRF data in the ultraviolet (UV), visible (VIS) and near-infrared (NIR) spectral regions. Presented in this paper are measurements of BRF from 1475 nm to 1625 nm obtained using an indium gallium arsenide detector and a tunable coherent light source. The sample was a 2-inch-diameter, 99% white Spectralon target. The BRF results are discussed and compared to empirically generated data from a model based on NIST-certified values of 6 deg directional/hemispherical spectral reflectance factors from 900 nm to 2500 nm. Employing a new NIST capability for measuring bidirectional reflectance using a cooled, extended InGaAs detector, BRF calibration measurements of the same sample were also made using NIST's STARR from 1475 nm to 1625 nm at an incident angle of 0 deg and at viewing angles of 40 deg, 45 deg, and 50 deg. The total combined uncertainty for BRF in this ShortWave Infrared (SWIR) range is less than 1%. This measurement capability will evolve into a BRF calibration service in the SWIR region in support of NASA remote sensing missions. Keywords: BRF, BRDF, Calibration, Spectralon, Reflectance, Remote Sensing
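    The two quantities at the heart of this abstract, a BRF realized against a calibrated reference and a combined standard uncertainty built from independent components, can be sketched as follows. This is a minimal illustration, not the DCaL/STARR procedure; the signal values and the uncertainty components are hypothetical.

```python
import numpy as np

def brf(sample_signal, reference_signal, reference_brf=1.0):
    """Bidirectional Reflectance Factor: the ratio of flux reflected by the
    sample to that of an ideal Lambertian reflector in the same geometry,
    here realized against a calibrated reference (hypothetical signals)."""
    return reference_brf * sample_signal / reference_signal

# Combined standard uncertainty (k=1) from independent relative components,
# summed in quadrature. Values are illustrative, not the actual budget.
components = np.array([0.004, 0.005, 0.006])  # e.g. source, detector, geometry
u_combined = float(np.sqrt(np.sum(components ** 2)))
```

    With these illustrative components the quadrature sum is below 0.01, consistent in form with the paper's statement that the total combined SWIR uncertainty is less than 1%.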

    The Effect of Incident Light Polarization on Vegetation Bidirectional Reflectance Factor

    A laboratory-based Bidirectional Reflectance Factor (BRF) polarization study of vegetation is presented in this paper. The BRF was measured using a short-arc xenon lamp/monochromator assembly producing an incoherent, tunable light source with a well-defined spectral bandpass at visible and near-infrared wavelengths of interest (470 nm and 870 nm), and a coherent light source at 1.656 microns. All vegetation samples were measured using P and S linearly polarized incident light over a range of incident and scatter angles. By comparing these results, we quantitatively examine how the BRF of the samples depends on the polarization of the incident light. The differences are significant, depend strongly on the incident and scatter angles, and can be as high as 120% at 67 deg incidence and 470 nm. The global nature of Earth's processes requires consistent long-term calibration of all instruments involved in data retrieval. The BRF defines the reflection characteristics of the Earth's surface. It provides the reflectance of a target in a specific direction as a function of illumination and viewing geometry. The BRF is a function of wavelength and reflects the structural and optical properties of the surface. Various space and airborne radiometric and imaging remote sensing instruments are used in the remote sensing characterization of vegetation canopies and soils, oceans, or especially large pollution sources. The satellite data are validated through comparison with airborne, ground-based and laboratory-based data in an effort to fully understand vegetation canopy reflectance. The Sun's light is assumed to be unpolarized at the top of the atmosphere; however, it becomes polarized to some degree due to atmospheric effects by the time it reaches the vegetation canopy. Although there are numerous atmospheric correction models, laboratory data are needed for model verification and improvement.
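    A difference "as high as 120%" implies some normalization of the P-minus-S contrast. One plausible metric, hedged as an assumption since the paper's exact definition is not given here, is the difference relative to the unpolarized (mean) BRF; the input values below are illustrative only.

```python
def polarization_difference_percent(brf_p, brf_s):
    """Percent difference between P- and S-polarized BRF, normalized by
    the unpolarized (mean) value. One plausible metric; the paper's
    exact definition may differ."""
    mean = 0.5 * (brf_p + brf_s)
    return 100.0 * (brf_p - brf_s) / mean

# Illustrative values for a strongly polarization-dependent geometry
# (e.g. large incident angle at 470 nm); not measured data.
diff = polarization_difference_percent(0.055, 0.025)
```

    With this normalization, a P/S pair of 0.055/0.025 yields a 75% difference, so contrasts above 100% correspond to the two polarizations differing by more than a factor of three.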

    Directional Reflectance Studies in Support of the Radiometric Calibration Test Site (RadCaTS) at Railroad Valley

    The Radiometric Calibration Test Site (RadCaTS) is a suite of commercial and custom instruments used to make measurements of the surface reflectance and atmosphere throughout the day at Railroad Valley, Nevada. It was developed in response to the need for daily radiometric calibration data for the vast array of Earth-observing sensors on orbit, which is continuously increasing as more nations and private companies launch individual environmental satellites as well as large constellations. The current suite of instruments at RadCaTS includes five ground-viewing radiometers (GVRs), four of which view the surface in a nadir-viewing configuration. Many sensors such as those on Landsat-7 and Landsat-8 view Railroad Valley within 3 deg of nadir, while others such as those on Sentinel-2A and -2B, RapidEye, Aqua, Suomi NPP, and Terra can view Railroad Valley at off-nadir angles. Past efforts have shown that the surface bidirectional reflectance distribution function (BRDF) has minimal impact on vicarious calibration uncertainties for views <10 deg, but the desire to use larger view angles has prompted the effort to develop a BRDF correction for data from RadCaTS. The current work investigates the application of Railroad Valley BRDF data derived from a BRF camera developed at the University of Arizona in the 1990s (no longer in use) to the current RadCaTS surface reflectance measurements. Also investigated are early results from directional reflectance studies using a mobile spectro-goniometer system during a round-robin field campaign in 2018. This work describes the preliminary results, the effects on current measurements, and the approach for future measurements.
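    The BRDF correction described here amounts to scaling the nadir-viewing GVR reflectance by the ratio of the surface reflectance factor at the sensor's view angle to that at nadir. A minimal sketch, with an entirely hypothetical BRDF-ratio table (real values would come from the BRF camera or spectro-goniometer data and vary with wavelength and solar geometry):

```python
import numpy as np

# Hypothetical ratio of off-nadir to nadir reflectance factor at a few
# view angles; assumed values for illustration only.
view_deg = np.array([0.0, 10.0, 20.0, 30.0])
brdf_ratio = np.array([1.000, 1.004, 1.015, 1.032])

def correct_to_view(rho_nadir, theta_view_deg):
    """Scale a nadir-viewing surface reflectance to an off-nadir view
    angle using a linearly interpolated BRDF ratio."""
    return rho_nadir * np.interp(theta_view_deg, view_deg, brdf_ratio)

# e.g. a sensor viewing Railroad Valley about 20 deg off nadir:
rho_20 = correct_to_view(0.30, 20.0)
```

    With the illustrative table above, the correction is a fraction of a percent below 10 deg, consistent with the statement that BRDF effects are minimal for views <10 deg, and grows toward a few percent at larger angles.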

    Ultra-Portable Field Transfer Radiometer for Vicarious Calibration of Earth Imaging Sensors

    A small portable transfer radiometer has been developed as part of an effort to ensure the quality of upwelling radiance from test sites used for vicarious calibration in the solar reflective spectral region. The test sites are used to predict top-of-atmosphere reflectance relying on ground-based measurements of the atmosphere and surface. The portable transfer radiometer is designed for one-person operation for on-site field calibration of instrumentation used to determine ground-leaving radiance. The current work describes the detector- and source-based radiometric calibration of the transfer radiometer, highlighting the expected accuracy and SI-traceability. The results indicate differences between the detector-based and source-based results greater than the combined uncertainties of the approaches. Results from recent field deployments of the transfer radiometer using a solar-radiation-based calibration agree with the source-based laboratory calibration within the combined uncertainties of the methods. The detector-based results show a significant difference to the solar-based calibration. The source-based calibration is used as the basis for a radiance-based calibration of the Landsat-8 Operational Land Imager (OLI) that agrees with the OLI calibration to within the uncertainties of the methods.
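    The agreement test used repeatedly in this abstract, two results agreeing (or not) "within the combined uncertainties of the methods", is the standard root-sum-square comparison. A minimal sketch with invented gain values and uncertainties, not the paper's numbers:

```python
import math

def agree_within_uncertainty(value_a, u_a, value_b, u_b, k=1.0):
    """True when two calibration results agree within their combined
    standard uncertainty (root-sum-square), with coverage factor k."""
    return abs(value_a - value_b) <= k * math.hypot(u_a, u_b)

# Illustrative only: solar- vs source-based results that agree, and
# detector- vs solar-based results whose difference is significant.
solar_vs_source = agree_within_uncertainty(1.000, 0.015, 1.012, 0.013)
detector_vs_solar = agree_within_uncertainty(1.000, 0.008, 1.025, 0.007)
```

    In this sketch `solar_vs_source` is True and `detector_vs_solar` is False, mirroring the pattern of agreement and disagreement the abstract reports.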

    Landsat-7 ETM+ Radiometric Stability and Absolute Calibration

    Launched in April 1999, the Landsat-7 ETM+ instrument is in its fourth year of operation. The quality of the acquired calibrated imagery continues to be high, especially with respect to its three most important radiometric performance parameters: reflective band instrument stability to better than ±1%, reflective band absolute calibration to better than ±5%, and thermal band absolute calibration to better than ±0.6 K. The ETM+ instrument has been the most stable of any of the Landsat instruments, in both the reflective and thermal channels. To date, the best on-board calibration source for the reflective bands has been the Full Aperture Solar Calibrator, which has indicated a change of at most –1.8% to –2.0% (95% C.I.) per year in the ETM+ gain (band 4). However, this change is believed to be caused by changes in the solar diffuser panel, as opposed to a change in the instrument's gain. This belief is based partially on ground observations, which bound the changes in gain in band 4 at –0.7% to +1.5%. ETM+ stability is also indicated by the monitoring of desert targets. These image-based results for four Saharan and Arabian sites, for a collection of 35 scenes over the three years since launch, bound the gain change at –0.7% to +0.5% in band 4. Thermal calibration from ground observations revealed an offset error of +0.31 W/(m^2 sr um) soon after launch. This offset was corrected within the U.S. ground processing system at EROS Data Center on 21-Dec-00, and since then, the band 6 on-board calibration has indicated changes of at most +0.02% to +0.04% (95% C.I.) per year. The latest ground observations have detected no remaining offset error, with an RMS error of ±0.6 K. The stability and absolute calibration of the Landsat-7 ETM+ sensor make it an ideal candidate to be used as a reference source for radiometric cross-calibration of other land remote sensing satellite systems.
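    A per-year gain change like the –1.8% to –2.0% figure quoted above is typically estimated by fitting a linear trend to a calibrator time series. The sketch below uses a made-up gain series chosen only to mimic a drift of that magnitude; it is not actual band-4 FASC data.

```python
import numpy as np

# Illustrative gain time series (band-average gain vs. years since
# launch); invented numbers, not measured ETM+ values.
years = np.array([0.0, 1.0, 2.0, 3.0])
gain = np.array([1.000, 0.982, 0.963, 0.944])

# Least-squares linear trend; slope/intercept gives fractional change
# per year relative to the launch-epoch gain.
slope, intercept = np.polyfit(years, gain, 1)
percent_per_year = 100.0 * slope / intercept
```

    For this invented series the fitted trend is about –1.9% per year, in the range the abstract attributes to diffuser-panel degradation rather than instrument gain change.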

    Landsat-7 ETM+: 12 years On-Orbit Reflective-Band Radiometric Performance

    The Landsat-7 ETM+ sensor has been operating on orbit for more than 12 years and characterizations of its performance have been ongoing over this period. In general, the radiometric performance of the instrument has been remarkably stable: (1) noise performance has degraded by 2% or less overall, with a few detectors displaying step changes in noise of 2% or less; (2) coherent noise frequencies and magnitudes have generally been stable, though the within-scan amplitude variation of the 20 kHz noise in bands 1 and 8 disappeared with the failure of the scan line corrector, and a new similar-frequency noise (now about 18 kHz) has appeared in two detectors in band 5 and increased in magnitude with time; (3) bias stability has been better than 0.25 DN out of a normal value of 15 DN in high gain; (4) relative gains, the differences in response between the detectors in a band, have generally changed by 0.1% or less over the mission, with the exception of a few detectors with a step response change of 1% or less; and (5) gain stability averaged across all detectors in a band, which is related to the stability of the absolute calibration, has been more stable than the techniques used to measure it. Due to the inability to confirm changes in the gain (beyond a few detectors that have been corrected back to the band average), ETM+ reflective band data continues to be calibrated with the pre-launch measured gains. In the worst case, some bands may have changed by as much as 2% in uncompensated absolute calibration over the 12 years.
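    The "relative gains" tracked in item (4) are conventionally each detector's response normalized by the band-average response, so that band-wide drifts cancel and only detector-to-detector differences remain. A minimal sketch of that normalization, with invented response values:

```python
import numpy as np

def relative_gains(detector_responses):
    """Relative gain of each detector: its response to a uniform source
    divided by the band-average response. A perfectly uniform band
    yields all ones; detector step changes show up as departures."""
    r = np.asarray(detector_responses, dtype=float)
    return r / r.mean()

# Illustrative: one detector with a step change relative to the band,
# like the isolated ~1% steps the abstract describes.
rg = relative_gains([100.0, 100.0, 101.0, 100.0])
```

    Because each value is normalized by the band mean, the relative gains always average to one; a detector corrected "back to the band average" is simply reset toward that mean.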

    Spectral Analysis of the Primary Flight Focal Plane Arrays for the Thermal Infrared Sensor

    The Thermal Infrared Sensor (TIRS) is a new longwave infrared (10 - 12 micron) pushbroom sensor for the Landsat Data Continuity Mission, with a 185 km ground swath and a 100 meter pixel size on the ground. The calibration issue is that a single detector requires only one calibration, whereas multiple detectors each require a unique calibration, which leads to pixel-to-pixel artifacts. The objectives are to: (1) predict the extent of residual striping when viewing a uniform blackbody target through various atmospheres, and (2) determine how different spectral shapes affect the derived surface temperature in a realistic synthetic scene.
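    The striping mechanism described here follows from each detector's band-effective radiance being the Planck radiance weighted by that detector's relative spectral response (RSR): detectors with differently shaped RSRs report different radiances for the same uniform scene. A sketch under assumed Gaussian RSRs (the real TIRS focal-plane RSRs differ in shape, not necessarily like this):

```python
import numpy as np

def planck_radiance(wl_um, temp_k):
    """Planck spectral radiance in W / (m^2 sr um), wavelength in microns."""
    c1 = 1.191042e8   # 2*h*c^2 in W um^4 / (m^2 sr)
    c2 = 1.4387752e4  # h*c/k_B in um K
    return c1 / (wl_um ** 5 * np.expm1(c2 / (wl_um * temp_k)))

def band_radiance(wl_um, rsr, temp_k):
    """Band-effective radiance: Planck radiance weighted by a detector's
    relative spectral response on a uniform wavelength grid."""
    return np.sum(planck_radiance(wl_um, temp_k) * rsr) / np.sum(rsr)

# Two hypothetical detectors with slightly shifted band centers viewing
# the same 300 K blackbody (no atmosphere in this simplified sketch).
wl = np.linspace(10.0, 12.0, 201)
rsr_a = np.exp(-0.5 * ((wl - 10.9) / 0.4) ** 2)
rsr_b = np.exp(-0.5 * ((wl - 11.1) / 0.4) ** 2)
delta = band_radiance(wl, rsr_a, 300.0) - band_radiance(wl, rsr_b, 300.0)
```

    Even for an identical uniform blackbody, the two detectors disagree by a small but nonzero radiance (`delta`), which is exactly the residual striping the study sets out to predict; adding atmospheric transmission to the weighting changes the size of the disagreement.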